Assembling the building blocks of neural computation
Neural networks in the brain are built from myriad components working together in diverse ways. These “building blocks” – ion channels, synapses, connectivity patterns – must coordinate within a network to achieve a particular computational goal, such as pattern separation or motion detection. Although much is known about how these building blocks work in isolation, a significant gap remains in our understanding of how they work together in real networks. The long-term goal of the Jeanne Lab is to bridge this gap by determining basic rules that govern how computation emerges from biophysics. We use the fruit fly because it exhibits robust behavior, has a compact and stereotyped brain, and is accessible for in vivo patch-clamp electrophysiology, optogenetics, and two-photon imaging. Neural computation is the foundation of cognition; a better understanding of its underlying mechanisms has the potential to inspire new strategies for repairing brains damaged by injury or disease.
Biography
James (Jamie) Jeanne received his Bachelor's degree in Electrical Engineering from Princeton University in 2005 and his doctorate in Computational Neuroscience from the University of California, San Diego in 2012. After completing postdoctoral studies at Harvard University in 2017, he started his lab in Yale's Department of Neuroscience.